New York employers must now tell applicants when they encounter AI
Starting today, job-hunters in New York City will be let in on a formerly hidden part of the application process, learning whether — and how — artificial intelligence is being used to make hiring decisions.
The city's automated employment decision tools law, enacted in 2021 and scheduled to be enforced beginning July 5, positions New York City as a leader in regulating the use of AI in hiring. Other cities and states are expected to gradually follow suit.
The narrowly tailored law is designed to offset potential misuses of AI in ways that could substantially affect workers' livelihoods. Specifically, it requires companies that lean on AI tools to make hiring decisions to disclose this fact to candidates. It also mandates that employers conduct annual third-party "bias audits" of the technology or software they use, in order to make public the ways in which the AI could be discriminating against certain types of candidates.
"If in fact the employers are using an automated employment decision tool (AEDT), then the employer has to commission an independent audit, publish a summary, tell applicants and employees they're using it, and give applicants the opportunity to have an accommodation and pursue an alternative selection process," Domenique Camacho Moran, an employment attorney at Farrell Fritz, told CBS MoneyWatch. "We are only talking about those tools that take the place of human people making decisions."
The audits are meant to keep tabs on sometimes-controversial tools that companies themselves don't always understand. AI screening tools can save companies time — but automated decision-making has also been criticized for replicating stereotypes and disadvantaging women and people of color in some contexts.
"That's the risk in all of this, that left unchecked, humans sometimes can't even explain what data points the algorithm is picking up on. That's what was largely behind this legislation," said John Hausknecht, a professor of human resources at Cornell University's school of Industrial and Labor Relations. "It's saying let's track it, collect data, analyze it and report it, so over time, we can make changes to the regulations."
But if potential hires don't like being judged by AI, their ability to opt out is limited. The law specifies that, while an AI screening disclosure "must include instructions for how an individual can request an alternative selection process or a reasonable accommodation under other laws, if available," the hiring company isn't required to actually use a different screening process.
Replacing human decisions
The law penalizes firms that fail to comply: a first violation carries a $500 fine, and subsequent offenses carry fines of up to $1,500.
Importantly, the scope of the law is very narrow.
"It's the very first law that's specifically calling out automated decision employment tools and regulating those specifically," Littler Mendelson employment attorney Niloy Ray told CBS MoneyWatch. "This is narrowly focused on the use of AI in hiring or promoting employees but not any other employment lifecycle decisions.
Just using an AI tool isn't enough to mandate disclosure: It must have a direct effect on hiring outcomes in order for the law to apply.
"We are only talking about those tools that take the place of humans making decisions," Camacho Moran said. "If you have an AI tool that runs through 1,000 applications and says, 'these are the top 20 candidates,' that is clearly a tool that falls within the definition of an AEDT."
She continued, "If, on the other hand, the AI is designed to put people into buckets, like these candidates have relevant experience, these have relevant education – pick your criteria – that's not a tool that would fall within the AEDT definition."
In other words, if the AI flags candidates with relevant experience, but a human being views all of the applications and remains the ultimate decision-maker, the law likely wouldn't apply, according to Camacho Moran.
A hurdle for small businesses?
Some critics of the law argue that its punitive nature — and the requirement of a bias audit — is burdensome, particularly to small and midsize employers experimenting with using AI to streamline and improve hiring processes.
"This requirement just adds one more cost to the process of hiring and promoting within New York City and it is a cumbersome one," said Ray, of Littler Mendelson. "So it creates a certain amount of risk of somehow not complying because it wasn't crystal clear what you needed to do to comply and certainly there's the cost of compliance."
For one, Emily M. Dickens, chief of staff and head of public affairs at the Society for Human Resource Management, objects to the fines.
"It was a good faith attempt to try to assign some regulatory guardrails around the issue that could impact some people adversely if it's not used correctly," she said. "But we should assume good intent until we see something very egregious. It's the first law of its kind and is likely to be replicated in other jurisdictions and you don't want to start with penalizing people for trying to do the right thing."
She supports the responsible use of AI in hiring processes given that many employers still struggle to recruit diverse workforces and that qualified candidates have fallen through the cracks under the human-centric approach.
"This process has been human-run for many years and we still have not solved the problem of creating more inclusive workplaces or accessing different talent and meeting the needs of firms struggling to find talent," Dickens said. "We need guardrails but we don't need overregulation at the cost of workforce innovation."